Equivariant deep learning: a hammer looking for a nail
Geordie Williamson (University of Sydney)
Abstract: Often one wants to learn quantities which are invariant or equivariant with respect to a group. For example, the decision as to whether there is a tiger nearby should not depend on the precise position of your head, and thus this decision should be rotation invariant. Another example: quantities that appear in the analysis of point clouds often do not depend on the labelling of the points, and are therefore invariant under a large symmetric group. I will explain how to build networks which are equivariant with respect to a group action. What ensues is a fascinating interplay between group theory, representation theory and deep learning. Examples based on translations or rotations recover familiar convolutional neural nets; however, the theory gives a blueprint for learning in the presence of complicated symmetry. These architectures appear very useful to mathematicians, but I am not aware of any major applications in mathematics as yet. Thus the nail of the title. Most of this talk will be a review of ideas and techniques well-known to the geometric deep learning community. New material is joint work with Joel Gibson (Sydney) and Sebastien Racaniere (DeepMind).
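The two symmetries mentioned in the abstract can be checked numerically. The sketch below (an assumption for illustration, not the speaker's construction) builds a toy permutation-invariant function on point clouds via per-point features followed by sum pooling, in the style of Deep Sets, and verifies that circular convolution commutes with cyclic shifts, i.e. is translation equivariant. All weights and sizes here are made up for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Permutation invariance (point clouds) ---
# Toy weights; a trained model would learn these.
W1 = rng.normal(size=(3, 8))
W2 = rng.normal(size=(8,))

def f(X):
    # Apply the same feature map to every point, then sum-pool.
    # Summing over points makes f invariant under row permutations.
    return float(np.tanh(X @ W1).sum(axis=0) @ W2)

X = rng.normal(size=(5, 3))                    # 5 unlabelled points in R^3
assert np.isclose(f(X), f(X[rng.permutation(5)]))

# --- Translation equivariance (convolution) ---
def conv(x, k):
    # Circular convolution via FFT; it commutes with cyclic shifts.
    return np.real(np.fft.ifft(np.fft.fft(x) * np.fft.fft(k, n=len(x))))

x = rng.normal(size=16)
k = rng.normal(size=4)
# Shifting the input then convolving equals convolving then shifting.
assert np.allclose(conv(np.roll(x, 3), k), np.roll(conv(x, k), 3))
```

Equivariant architectures for more complicated groups generalise exactly this pattern: constrain the layers so that the group action commutes with them.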
Topics: machine learning, mathematical physics, algebraic geometry, algebraic topology, number theory
Audience: researchers in the topic
DANGER2: Data, Numbers, and Geometry
Organizers: Alexander Kasprzyk (contact for this listing), Thomas Oliver, Yang-Hui He
